
The Persistence of QAnon on Mainstream Social Media

20th October 2021 Dr. Gerard Gill

QAnon is a far-right conspiracy theory and cultic social movement that needs little introduction. The harms attributed to it include an armed standoff at the Hoover Dam, various child kidnapping plots, elements of the January 6 insurrection, and a number of murders. Recently, a father killed his two children after QAnon theories convinced him that doing so would save the world from monsters.

Eventually, after the January 6 insurrection, tech companies took action, banning and removing far-right content and accounts at scale. QAnon took a major hit on mainstream platforms such as Twitter and Facebook. While such measures are not a panacea for the problem of online extremism and radicalisation, they make a notable difference: the volume of QAnon content on these platforms dropped considerably. And while violent extremist groups can persist on platforms such as Gab and Telegram, their ability to reach the uninitiated is curtailed, curbing their growth.

However, bans on groups such as QAnon are difficult to enforce and maintain. This Insight will explore the extent to which QAnon has managed to persist on Facebook, Twitter, Instagram and TikTok. Unsurprisingly, QAnon content can still be found to varying degrees on all of these platforms, but on none more than TikTok – indeed, a significant portion of the content found elsewhere originated on TikTok.

The platforms in question have incorporated a number of common QAnon terms and phrases into their moderation, but not all of them. On Facebook, searches for ‘WWG1WGA’, ‘Trust The Plan’, or other such commonly known references yield no results, only a content warning. In my limited search, one major public group was identified sharing QAnon content – a ‘Save Our Children’ group with around 2,000 followers. Content within the group includes reams of purported documentation of a UK-based adrenochrome harvesting operation, posts about shooting paedophiles, and a suggestion that the Ever Given Suez Canal crisis was connected to the trafficking of children. Various TikTok videos are shared, again about the smuggling of children, but also one claiming Sandra Bullock uses baby blood to retain her youthful appearance. Alongside this large public group there is a proliferation of private groups, mostly small, though one entitled ‘AWAKE AND UNITED: Resist The Deep State!’ boasts 2,600 members.

Accounts associated with QAnon on Twitter can be identified in several fairly obvious ways, including a simple search for #WWG1WGA. Most of these accounts are small, with follower counts in the tens or hundreds. Among the claims made by these accounts, one asserts that the Queen of England has been executed and replaced by an actor. One account asks its followers, ‘If I told you that McDonalds use 25% of human meat in their burgers would you believe me?’ Another despairs at the state of the world and the evil witnessed by the awakened anons, predicting that some of humanity will not understand and will be left behind after the great awakening. Again, TikTok videos are often shared, with one outlining supposed evidence of a conspiracy regarding the origins of COVID-19, involving George Soros, Bill Gates, various other public figures, and a (fictional) GlaxoSmithKline lab in Wuhan, China.

The vast majority of QAnon Instagram content identified in this small exploratory study was not original but re-posted from other platforms – some from Twitter, but most from TikTok. The presence of shared TikTok content in QAnon spaces on Facebook, Twitter, and Instagram suggests that QAnon is flourishing, in relative terms, on that platform – a suspicion confirmed by a simple search. A cursory browse reveals the following content:

  • Claims that smart cities initiatives are pretext for widespread surveillance;
  • Suggestions that the Nicolas Cage film ‘Knowing’ is somehow prescient of current events;
  • Musings comparing the world to a ‘zombie apocalypse’, where ‘everyone is just bodies walking around’ except the one in 25 people who are awake and aware;
  • Outlandish Hollywood fashion presented as evidence of a cult;
  • Claims that COVID-19 and US dock worker shortages have been engineered to destroy the economy;
  • A woman’s story about how she was ‘pilled’ with the truth about adrenochrome and demons who rule the world;
  • January 6 insurrection footage with a rousing AC/DC soundtrack;
  • Claims that Hollywood elites drink blood.

The circulation of this wealth of QAnon TikTok content on other platforms, where such content is less prevalent, is consistent with recent research by the Anti-Defamation League, which found links to Gab, an infamous source of hate and misinformation, widely disseminated on Twitter. Unlike Gab, however, TikTok is not inherently toxic, nor an appropriate subject for a blanket ban. Rather, it has been suggested that TikTok lags severely behind more established platforms in dealing with extremist content. This situation requires urgent remedy: TikTok is one of the most popular applications in the world, with a young and potentially impressionable audience.